The original film Westworld is a campy delight. Produced in 1973, it can’t help its Cheez-Whiz feel. But the story, written by Michael Crichton, has a gold-mine of a premise that was just begging for an upgrade. It was no surprise that Jonathan Nolan teamed up with HBO to produce a new series by the same name. And it is fun, fun, fun.
In both versions, “Westworld” is an Old West theme park for wealthy adults. For $40,000 a day, guests arrive at luxurious accommodations, put on cowboy attire, and enter an enormous outdoor arena full of androids (indistinguishable from humans) who have been programmed to respond to the guests as they might in a game. There is a saloon with prostitutes, local family ranches, bandits on the loose, and a seemingly endless frontier to explore. Guests are enticed by these androids—called hosts—to enter into a number of infinitely flexible storylines. They can go on a search for lost treasure. They can ride with the sheriff to hunt for an outlaw. Or they can just stay in town and see what happens. In every case, the guests can abuse the hosts at will without concern about being harmed in return. Hosts can be killed; guests cannot. At the end of the day the dead or damaged hosts are gathered up by a discreet clean-up squad, transported underground, repaired, reprogrammed, and returned to service.
This is domestic sci-fi at its best. But as a fan of both the movie and the series, I couldn’t help but wonder: what would an alien civilization think about the West if all they saw was HBO’s Westworld? Here are my thoughts.
1. We like having our basest desires pandered to.
This is both a primary theme of Westworld and a deliberately ironic part of its appeal. HBO is known for pushing the boundaries when it comes to sex and violence, so the network is a fitting venue for a show that appeals to these impulses. Viewers are turned into voyeurs as guests fulfill their wildest fantasies: a night of drinking and whoring; a violent bar fight with quick draw action; a heroic rescue of the girl from savages. HBO knows its audience. In the first episode alone there are three bared-breast scenes. The characters often make remarks about how all the guests want to do is shoot and/or screw the hosts. For creator Dr. Robert Ford (Anthony Hopkins), gratifying these impulses is just fine. He unapologetically works to make the hosts seem more realistic so the guests enjoy the illusion even more. And so it might be for us, too, as we look on. We are implicated.
But Westworld is a smart twenty-first century production, not a vulgar peep show. And so our aliens would quickly learn that we are also nervous about these cheaply-won thrills. While there may be nothing more American than the desire to enjoy the pleasures of sex and violence without taking any responsibility for our actions, there is still some leftover Calvinism in us somewhere. Deep down we know that habitually seeing other beings as existing solely for our amusement isn’t good for our souls. In the pilot, an older host, playing the role of Peter Abernathy (Louis Herthum), starts to glitch. Although he was supposed to have been wiped clean of previous programming, he reverts to some lines from the professor he had played in a prior storyline. Just before he is decommissioned and removed from the park he whispers into the ear of his “daughter”: “These violent delights have violent ends.” His daughter Dolores (Evan Rachel Wood) is Westworld’s host protagonist and the first android who begins to gain self-consciousness. She doesn’t know what these words mean, but Nolan knows we know—or at least that we can Google with the best of them. They are from Shakespeare’s Romeo and Juliet, and are spoken by Friar Lawrence right after Romeo essentially tells him, “I don’t care what happens to me as long as I get Juliet”:
These violent delights have violent ends
And in their triumph die, like fire and powder,
Which, as they kiss, consume. The sweetest honey
Is loathsome in his own deliciousness
And in the taste confounds the appetite.
Therefore love moderately. Long love doth so.
Too swift arrives as tardy as too slow.
At the very least, Friar Lawrence is pleading for Romeo to be moderate, patient, and aware of the consequences of his actions. If you let your desires burn hot and fast they will burn you up with them. Abernathy is predicting the ultimate doom of the theme park, of course, but he is also issuing a warning against the game itself, its power to spiritually deform the guests. Just as honey that is too sweet “confounds the appetite,” so the cheap delights of Westworld will make it less possible to live well in the real world. As if to prove this point, Abernathy himself goes nuts with destructive impulses in part because he had for many years inhabited the role of professor-leader of a desert cult that had become cannibalistic (a clear nod to Cormac McCarthy’s infamous Judge Holden in Blood Meridian). If you lived out this role again and again and again, killing (nearly real) people, when would you reach the point of no return?
These violent delights have violent ends because it’s all fun and games until someone gets an eye poked out. It turns out there are a lot of ways to get your eye poked out in Westworld—and the supposedly insentient hosts are doing the poking. We all know what happened to the star-crossed lovers of Shakespeare’s play. Stay tuned, alien watchers.
2. We have become completely posthuman.
In her book How We Became Posthuman, N. Katherine Hayles explains that we became posthuman the moment we accepted the Turing test as the measure of sentience. The test is basically as follows: hidden from view, a machine and a human communicate with a human questioner. If the questioner cannot tell which respondent is the machine, the machine has achieved a measure of artificial intelligence. Westworld echoes this test right from the start. In the pilot’s opening scene, a programmer asks Dolores: “Have you ever questioned the nature of your reality?” Hayles argues that our acceptance of this test means that we now define sentient life in terms of information patterns and data—not things like embodied experience.
This question of artificial intelligence and the measure of sentience is central to the HBO remake, and not peripheral as it was in the 1973 version. The central character known only as the Man in Black (Ed Harris) becomes obsessed with figuring out the deeper levels of the game. Those deeper levels are signified by a maze, and the maze is meant to say something profound about the nature of consciousness itself (fully explored in “The Bicameral Mind,” the final episode of season one). It’s not spoiling much to reveal that Arnold, one of the original founders of the park, wanted the machines to gain consciousness and become independent and free, and ultimately to prevent the park from opening. He thought consciousness was a pyramid involving steps upward to the top. But he was wrong, he tells Dolores, because “consciousness isn’t a journey upward but a journey inward. Not a pyramid but a maze. Every choice could bring you closer to the center or send you spiraling to the edges, to madness. Do you understand what the center represents? Whose voice I’ve been wanting you to hear?” The voice Arnold wants Dolores to hear is her own voice. This would mean she is alive. At this point I’ll let you decide whether what is said is actually profound or just sounds that way. (The Nolan brothers sometimes have this problem.)
Regardless, the show’s premise requires that we see life as posthuman even as it questions the larger moral ramifications of this move. Ford’s speeches show him to be completely unbothered by a thoroughly posthuman world. “We’ve managed to slip evolution’s leash,” he says, seemingly delighted to become the next step in humanity’s control over the evolutionary process. He could be an android himself, of course, which would make him only the latest iteration of the productive and revelatory confusion between Dr. Frankenstein and his monster. When a younger guest, William, arrives and meets his first host (the beautiful flight-attendant type reminiscent of those in the original film), she urges him to ask what she knows he is wondering. So he asks, “Are you real?” And she responds: “Well, if you can’t tell, does it matter?”
The Nolan brothers have gone down the “what does it mean to be human” path many times before (Memento, Interstellar, Inception), and Westworld reveals why the question is increasingly relevant. What does accepting the terms of the Turing test to define life say about us? Do we think that it is just fine to see our lives technologically, as an effort to perfect our human existence and thereby become gods? Furthermore, if we accept that there is no important difference between human beings and AI, how will that impact how we take ethical responsibility for others? I love how the show is working out this question through the Ed Harris character. More on him below.
3. We fear losing control of technology.
At the beginning of the 1973 film, as guests are ushered into the park, a Siri-like voice assures them that “nothing can go wrong.” This means, of course, that everything will go wrong. Crichton’s original Westworld was defined by the well-traveled horror motif of machines going haywire. At least since Shelley’s Frankenstein, when men become like gods their creations are going to come for them. Yul Brynner, the steely-eyed robot gunslinger of the original film, is a clear model for Arnold Schwarzenegger as the Terminator. (Fascinatingly, Warner Brothers cast Schwarzenegger in a remake of Westworld in 2002, but he left the project to run for governor of California and the film was never made.)
HBO’s Westworld clearly picked up on this machines-gone-crazy motif in the pilot. When Abernathy glitches, he delivers a chilling speech to Ford, full of literary references.
Ford: What is your itinerary?
Abernathy: To meet my maker.
Ford: And what do you want to say to your maker?
Abernathy: By my most mechanical and dirty hand I shall have such revenges on you both. I will do such things…what they are yet I know not, but they shall be the terrors of the earth.
In true postmodern style, Abernathy splices together quotes from two different Shakespeare plays: Henry IV, Part 2 and King Lear. The effect is chilling. But after this first episode, the plot widens to indicate that this Westworld is going to be more about terror than it is about horror. The difference between terror and horror is subtle but important. Horror is turning around to face Yul Brynner’s gunslinger burned to a crisp, his hand still reaching out to grab you. Terror is the slower build-up of more existential types of fear: what is it that we have done in this park? Are these machines alive? If they are, what does it mean to be alive? Am I one of Ford’s machines? How would I know it if I were?
Losing control of our machines has become such an important theme of our twenty-first century fiction precisely because it represents the inverse of the goal of technology, which is to gain maximum control over our environment. If the point of technology is to limit contingency, when contingency bites back, it means double the hurt.
4. We are worried that we might be here by accident.
What I love about Westworld is what I love about everything that the Nolan brothers have worked on: it’s all about the power and importance of story. For story itself is a key access point to the question we all face: do our lives have greater meaning provided by a creator, or are we just the result of an evolutionary accident?
The series keeps this question in view through its introductory sequence. With the haunting theme music playing, we watch as a 3D printer spins together sinews of a moving horse and its western rider, who is half flesh and half skeleton. Then we cut to two skeleton hands playing the theme music on the piano, only to have those hands lift off the keyboard with the keys still moving. Player pianos (pre-programmed to play music from perforated sheets) became increasingly popular at the end of the nineteenth century. In Westworld the image works constantly to foreground the question: are we making our own music, or are we pre-determined by genetic code? Are we really writing our own stories, or do we just think that we are? This question has become a twenty-first century staple in disciplines from neuroscience to philosophy to psychology. An increasing number of scientists and science popularizers, like Antonio Damasio and Daniel Dennett, are telling us that we aren’t writing our own stories because free will is an illusion. The power of Westworld derives from the ethical urgency of precisely that question.
This is where the story of the Man in Black, the guest protagonist, comes back in. The Ed Harris character has been visiting the park for thirty years. He saved the company from bankruptcy. He has become increasingly obsessed with figuring out the game because he believes that it, unlike real life, has an end, a purpose. As he says to one of the hosts: “You know why this place beats the real world? The real world is just chaos. An accident. But in here every detail adds up to something.” The Man in Black wants a life of purpose. He wants his choices to mean something to someone, to work toward some meaningful end. What he doesn’t fully understand is that fiction is the only place where “every detail adds up to something”—which is precisely why we turn to it. When Ford intones that “the guests don’t come looking for who they are; they already know who they are. They’re here because they want a glimpse of what they could be,” he is giving one of the main reasons why we read fiction. We read for the possibility of, and hope for, a redemptive shape to our lives. Not all fictions are equal to the task, however, and ritual exposure to some can be toxic. We need to be discerning readers. At their worst, novels are an empty escape into a fantasy world. But at their best, novels give us a picture of the risks and rewards of the everyday choices we make that are turning us into the people we are. We do well to heed their warnings.
But this theme park is not a good novel, and the Man in Black is not a discerning reader. By the end of season one he is clearly degenerating. Addicted to the choose-your-own-adventure escape the park provides, he has grown increasingly cynical, callous to others, and closed off to love. Ritual exposure to this game has damaged him. It has failed him as badly as any addiction to porno-violence can. It has not provided him with salutary ways to think about his own life. It has not provided him with a storyline he can actually enter, with virtues he can emulate. Instead, it has taught him to try to beat an ultimately meaningless game by destroying everything in his path. There is evidence at the end of season one that his regular visits to Westworld contributed to his marital troubles at home and to the ultimate suicide of his wife. These violent delights have violent ends. Stay tuned—season two is going to be all about violent ends.
After learning these four things about us from Westworld, an alien civilization might acquire a still more troubling bit of insight. Even two hundred years after Mary Shelley warned us about it, arrogant, spiritually-malformed white guys (or androids made in the image of the same?) are still running the show here on planet Earth. In episode eight, Dr. Ford quotes Frankenstein in a way that illustrates that he missed the point of the novel—unless he meant to quote it ironically, which remains to be seen. Having just killed off one of the human employees to cover up his larger actions, he intones: “One man’s life or death were but a small price to pay for the acquirement of the knowledge which I sought, for the dominion I should acquire.” If I were an alien, that might give me cause to give planet Earth a skip, at least for now.
Christina Bieber Lake is the Clyde S. Kilby Professor of English at Wheaton College.